Open Source. Brainwaves. Code. Let’s go.
OpenBCI (Open-source Brain-Computer Interface) is a platform that lets anyone collect and analyze biosignals with open-source hardware and software:
- EEG (brainwaves)
- EMG (muscle activity)
- ECG (heart signals)
Founded by Joel Murphy and Conor Russomanno in 2013,
OpenBCI was successfully crowdfunded on Kickstarter and has since grown into one of the most accessible BCI platforms in the world.
Website: openbci.com
GitHub: github.com/OpenBCI
Traditionally, EEG and biosignal analysis required expensive medical-grade equipment.
But researchers, artists, hackers, and developers needed a cheaper, hackable alternative.
So, OpenBCI emerged to make brain-computer interfacing accessible to everyone — not just labs.
Core Values:
✅ Open-source hardware & software
✅ Arduino-compatible & customizable
✅ Developer and maker-friendly community
✅ Affordable (roughly 10–20% of the cost of medical-grade EEG systems)
Today, OpenBCI is used in neurofeedback, psychology experiments, BCI games, and even brain-controlled robots.
🔌 Hardware
Cyton Board (8 channels)
– The main EEG board that’s Arduino-compatible.
Daisy Module
– Expands Cyton to 16 channels.
Ganglion Board (4 channels)
– Entry-level board for simple experiments.
Ultracortex Headset
– A 3D-printed EEG headset that fits electrode sensors onto your head.
WiFi Shield
– Enables faster streaming over WiFi (instead of Bluetooth).
Note: "Channel" means number of brain areas measured simultaneously. 8-channel = 8 locations.
💻 Software
OpenBCI GUI
– Visualize, record, and analyze brainwave data.
BrainFlow SDK
– A powerful cross-platform SDK (Python, C++, JS) for controlling boards and reading data (see the quick snippet after this list).
3rd-party integration
– Works with OpenViBE, BCILAB, and more.
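As a quick taste of BrainFlow before the full example below, here's a sketch that asks the library what it knows about a board without plugging anything in (these are real BrainFlow calls, but treat the snippet as illustrative):

from brainflow.board_shim import BoardShim, BoardIds

# Describe the 8-channel Cyton board: name, sampling rate, channel layout, etc.
print(BoardShim.get_board_descr(BoardIds.CYTON_BOARD))
print(BoardShim.get_sampling_rate(BoardIds.CYTON_BOARD))  # 250 (Hz) for Cyton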
OpenBCI isn’t just about collecting data — it’s about creating new interactions between brain and machine. Some ideas:
Focus Detection
– Visualize concentration in real time from beta-band activity (a band-power sketch follows the streaming example below)
Neurofeedback Apps
– Feedback loops that change visuals based on relaxation or focus
Brain-Controlled Games
– Move a game character by focusing or relaxing
Drone Control
– Map brain patterns to control drone directions
VR/AR Integration
– Make immersive experiences that react to your mental state
Here’s a simple example using Python & BrainFlow SDK:
import argparse
import time

from brainflow.board_shim import BoardShim, BrainFlowInputParams, BoardIds


def main():
    BoardShim.enable_dev_board_logger()

    parser = argparse.ArgumentParser()
    # use docs to check which parameters are required for a specific board, e.g. for Cyton - set serial port
    parser.add_argument('--timeout', type=int, help='timeout for device discovery or connection', required=False,
                        default=0)
    parser.add_argument('--ip-port', type=int, help='ip port', required=False, default=0)
    parser.add_argument('--ip-protocol', type=int, help='ip protocol, check IpProtocolType enum', required=False,
                        default=0)
    parser.add_argument('--ip-address', type=str, help='ip address', required=False, default='')
    parser.add_argument('--serial-port', type=str, help='serial port', required=False, default='')
    parser.add_argument('--mac-address', type=str, help='mac address', required=False, default='')
    parser.add_argument('--other-info', type=str, help='other info', required=False, default='')
    parser.add_argument('--serial-number', type=str, help='serial number', required=False, default='')
    parser.add_argument('--board-id', type=int, help='board id, check docs to get a list of supported boards',
                        required=True)
    parser.add_argument('--file', type=str, help='file', required=False, default='')
    parser.add_argument('--master-board', type=int, help='master board id for streaming and playback boards',
                        required=False, default=BoardIds.NO_BOARD)
    args = parser.parse_args()

    # copy the parsed arguments into BrainFlow's connection parameters
    params = BrainFlowInputParams()
    params.ip_port = args.ip_port
    params.serial_port = args.serial_port
    params.mac_address = args.mac_address
    params.other_info = args.other_info
    params.serial_number = args.serial_number
    params.ip_address = args.ip_address
    params.ip_protocol = args.ip_protocol
    params.timeout = args.timeout
    params.file = args.file
    params.master_board = args.master_board

    board = BoardShim(args.board_id, params)
    board.prepare_session()
    board.start_stream()
    time.sleep(10)  # record for 10 seconds
    # data = board.get_current_board_data(256)  # get latest 256 packages or less, doesn't remove them from internal buffer
    data = board.get_board_data()  # get all data and remove it from internal buffer
    board.stop_stream()
    board.release_session()

    print(data)


if __name__ == "__main__":
    main()
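If you save the script as, say, get_data.py (the filename is arbitrary), you can run it like this. Board ids come from the BoardIds enum, and serial port names vary by OS (e.g., COM3 on Windows):

python get_data.py --board-id 0 --serial-port /dev/ttyUSB0   # Cyton over serial
python get_data.py --board-id -1                             # Synthetic board, no hardware required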
Want to see real-time brain activity in your terminal? This is all you need to get started.
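And here's the "Focus Detection" idea from the list above as a minimal sketch. It uses BrainFlow's synthetic board (so it runs without hardware) and DataFilter.get_avg_band_powers to compute average band powers; the beta/alpha ratio as a "focus" proxy is a common neurofeedback heuristic, not an official OpenBCI metric:

import time

from brainflow.board_shim import BoardShim, BrainFlowInputParams, BoardIds
from brainflow.data_filter import DataFilter


def main():
    # SYNTHETIC_BOARD generates artificial signals, so no hardware is required
    board_id = BoardIds.SYNTHETIC_BOARD
    board = BoardShim(board_id, BrainFlowInputParams())
    board.prepare_session()
    board.start_stream()
    time.sleep(5)  # collect a few seconds of data
    data = board.get_board_data()
    board.stop_stream()
    board.release_session()

    eeg_channels = BoardShim.get_eeg_channels(board_id)
    sampling_rate = BoardShim.get_sampling_rate(board_id)
    # returns averages (and std devs) over 5 bands: delta, theta, alpha, beta, gamma
    avg_powers, _ = DataFilter.get_avg_band_powers(data, eeg_channels, sampling_rate, True)
    alpha, beta = avg_powers[2], avg_powers[3]
    print(f'beta/alpha ratio (rough focus proxy): {beta / alpha:.2f}')


if __name__ == '__main__':
    main()

Swap in your own board id and connection parameters, and the same band powers can drive a live visualization instead of a print statement.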
No technology is perfect — especially when it’s trying to read the human brain from outside your skull.
❌ Current Limitations:
Not as precise as medical-grade EEGs
Signal noise from muscle movement or electrical interference (see the filtering sketch after this list)
Setup is slightly bulky (headset, wires, gels)
High individual variability – training is often required
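On the noise point, BrainFlow ships basic cleanup tools. Here's a tiny sketch using DataFilter.remove_environmental_noise to notch out 50 Hz mains hum; the sine-wave signal below is a synthetic stand-in for a real EEG channel:

import numpy as np

from brainflow.data_filter import DataFilter, NoiseTypes

sampling_rate = 250  # Cyton's sampling rate
t = np.arange(2 * sampling_rate) / sampling_rate
# 10 Hz "alpha" rhythm plus 50 Hz mains interference
signal = (np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)).astype(np.float64)
# in-place 50 Hz notch filter; use NoiseTypes.SIXTY in 60 Hz mains regions
DataFilter.remove_environmental_noise(signal, sampling_rate, NoiseTypes.FIFTY.value)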
✅ But the Future Looks Bright:
Non-invasive BCI is safer and more accessible than brain implants
Machine learning continues to improve signal classification
New wearable-friendly designs are emerging for daily use
Communities are constantly pushing the boundaries (just look at their Discord)
Whether you're a student, developer, artist, or just curious
— OpenBCI gives you the tools to communicate with the brain in real time.
The barriers are low, the code is open, and the community is thriving.
We’re not just reading brainwaves anymore. We’re building with them.
So what will you create with OpenBCI?